
Markov process (noun)

  • (probability theory) Any stochastic process for which the conditional probability distribution of future states depends only on the current state (and not on past states).
Italian: processo di Markov, processo markoviano
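
In symbols, the defining property above can be written, for a discrete-time process (a sketch of one common formulation; the notation $X_n$ for the state at step $n$ is assumed here, not part of the entry):

  $P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)$

That is, conditioning on the full history yields the same distribution as conditioning on the current state alone.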